Unsupervised Kernel Dimension Reduction Supplemental Material
Abstract
We start by noting that conditional independence X ⊥ Y | B⊤X does not necessarily imply that the correlation between B⊤X and Y is maximized. To see this, let X be a Gaussian random variable with zero mean and a diagonal covariance matrix. Assume B is an identity matrix and Y = X² = (B⊤X)² (elementwise square for a vectorial X). The conditional independence is obviously satisfied, yet the correlation between B⊤X and Y is zero. This observation is yet another example of the limitation of Pearson's correlation measure, which detects only linear dependence between random variables. In the following, we show that when dependence is measured in the RKHS, the two measures Ĵ_{YY|X} and Ĵ_{XY} are equivalent. Assume we use the Gaussian RBF kernel for both Ĵ_{YY|X}(B⊤X, Y) and Ĵ_{XY}(B⊤X, Y):

K(x_i, x_j) = exp(−‖x_i − x_j‖² / σ_N²)
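As a concrete illustration of the example above, here is a minimal Python sketch (not the paper's code). It uses a biased HSIC estimate with Gaussian RBF kernels as a stand-in kernel dependence measure; the bandwidth sigma and the sample size are arbitrary choices. The Pearson correlation between X and Y = X² comes out near zero, while the kernel measure is clearly positive.

```python
import numpy as np

def rbf_gram(x, sigma):
    # Gaussian RBF Gram matrix: K_ij = exp(-|x_i - x_j|^2 / sigma^2)
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / sigma ** 2)

def hsic(x, y, sigma=1.0):
    # Biased HSIC estimate trace(K H L H) / n^2, with centering H = I - 11^T/n
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    K = rbf_gram(x, sigma)
    L = rbf_gram(y, sigma)
    return np.trace(K @ H @ L @ H) / n ** 2

rng = np.random.default_rng(0)
x = rng.standard_normal(500)   # zero-mean Gaussian X
y = x ** 2                     # Y = X^2: deterministic, but nonlinear

print(np.corrcoef(x, y)[0, 1])  # near 0: Pearson misses the dependence
print(hsic(x, y))               # clearly positive: the kernel measure detects it
```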
Similar Papers
Unsupervised Kernel Dimension Reduction
We apply the framework of kernel dimension reduction, originally designed for supervised problems, to unsupervised dimensionality reduction. In this framework, kernel-based measures of independence are used to derive low-dimensional representations that maximally capture information in covariates in order to predict responses. We extend this idea and develop similarly motivated measures for uns...
Unsupervised Multiple Kernel Learning
Traditional multiple kernel learning (MKL) algorithms are essentially supervised, in the sense that the kernel learning task requires the class labels of the training data. However, class labels may not always be available prior to the kernel learning task in some real-world scenarios, e.g., in an early preprocessing step of a classification task or in an unsupervised learning task such as dimens...
Evolutionary Unsupervised Kernel Regression
Dimension reduction and manifold learning play an important role in robotics, multimedia processing and data mining. For these tasks, strong methods such as Unsupervised Kernel Regression [4, 7] and Gaussian Process Latent Variable Models [5, 6] have been proposed in recent years. However, many methods suffer from numerous local optima and critical parameter dependencies. We use advanced methods from st...
Regression on manifolds using kernel dimension reduction
We study the problem of discovering a manifold that best preserves information relevant to a nonlinear regression. Solving this problem involves extending and uniting two threads of research. On the one hand, the literature on sufficient dimension reduction has focused on methods for finding the best linear subspace for nonlinear regression; we extend this to manifolds. On the other hand, the l...
Unsupervised Nonlinear Feature Extraction Method and Its Effects on Target Detection in High-dimensional Data
Principal component analysis (PCA) is one of the most effective unsupervised techniques for feature extraction. To extract higher-order properties of data, researchers extended PCA to kernel PCA (KPCA) by means of kernel machines. In this paper, KPCA is applied as a feature extraction procedure for dimension reduction, as a preprocessing step for target detection on hyperspectral images. Then the ...
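As a hedged illustration of the preprocessing step this abstract describes, the following minimal Python sketch (assuming scikit-learn; the data, gamma, and n_components are hypothetical stand-ins, not values from the cited paper) applies KPCA with an RBF kernel for dimension reduction alongside linear PCA.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))  # stand-in for high-dimensional spectra

# Linear PCA captures only second-order (covariance) structure;
# KPCA with an RBF kernel can extract higher-order properties of the data.
Z_linear = PCA(n_components=10).fit_transform(X)
Z_kernel = KernelPCA(n_components=10, kernel="rbf", gamma=0.02).fit_transform(X)

print(Z_linear.shape, Z_kernel.shape)  # both (200, 10), ready for a detector
```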